Abstract: For social robots to be successfully integrated and accepted within society, they need to be able to interpret human social cues that are displayed through natural modes of communication. Service robots interact directly with people, so finding a more natural and easy-to-use interface is of fundamental importance. In particular, a key challenge in the design of social robots is developing the robot’s ability to recognize a person’s affective states (emotions, moods, and attitudes) in order to respond appropriately during social Human–Robot Interactions (HRIs). This project proposes the development of a social robot that can autonomously determine a person’s degree of accessibility. The work will evaluate the performance of our automated system in recognizing and classifying a person’s accessibility levels, and will investigate how people interact with an accessibility-aware robot that determines its own behaviours based on those levels.
Keywords: Rehabilitation, Body gesture recognition, Facial expressions, Modelling of body gestures.